What technology takes from us – and how to take it back
Decisions outsourced, chatbots for friends, the natural world an afterthought: Silicon Valley is giving us a life devoid of connection. There is a way out - but it's going to take collective effort.

Summer after summer, I used to descend into a creek that had carved a deep bed shaded by trees and lined with blackberry bushes whose long thorny canes arced down from the banks, dripping with sprays of fruit. Down in that creek, I'd spend hours picking until I had a few gallons of berries, until my hands and wrists were covered in scratches from the thorns and stained purple from the juice, until the tranquillity of that place had soaked into me. The berries on a single spray might range from green through shades of red to the darkness that gives the fruit its name. Partly by sight and partly by touch, I determined which berries were too hard and which too soft, picking only the ones in between, while listening to birds and the hum of bees, to the music of water flowing, noticing small jewel-like insects among the berries, dragonflies in the open air, water striders in the creek's calm stretches. I went there for berries, but I also went there for the quiet, the calm, the feeling of cool water on my feet and sometimes up to my knees as I waded in where the picking was good. At home I made jars of jam. When I gave them away I was trying to give not just my jam - which was admittedly runny and seedy - but something of the peace of that creek, of summer itself.
Should We Look on New Technologies with Awe and Dread?
The technological sublime helps us grasp the power of what we're creating, but at a cost. One of the most famous cuts in cinema history, from "2001: A Space Odyssey," perfectly captures a concept known as the technological sublime. First, we see an angry ape bludgeoning one of his fellows to death with a scavenged bone; he's only just discovered that bones can be used this way, and he hurls his weapon into the air in celebration. We follow the bone upward as it tumbles against the unpolluted blue sky.
The Machines Finding Life That Humans Can't See
A suite of technologies is helping taxonomists speed up species identification. Across a Swiss meadow and into its forested edges, the drone dragged a jumbo-size cotton swab from a 13-foot tether. Along its path, the moistened swab collected scraps of life: some combination of sloughed skin and hair; mucus, saliva, and blood splatters; pollen flecks and fungal spores. Later, biologists used a sequencer about the size of a phone to stream the landscape's DNA into code, revealing dozens upon dozens of species, some endangered, some invasive.
AI gives voice to dead animals in Cambridge exhibition
If the pickled bodies, partial skeletons and stuffed carcasses that fill museums seem a little, well, quiet, fear not. In the latest coup for artificial intelligence, dead animals are to receive a new lease of life to share their stories – and even their experiences of the afterlife. More than a dozen exhibits, ranging from an American cockroach and the remnants of a dodo, to a stuffed red panda and a fin whale skeleton, will be granted the gift of conversation on Tuesday for a month-long project at Cambridge University's Museum of Zoology. Equipped with personalities and accents, the dead creatures and models can converse by voice or text through visitors' mobile phones. The technology allows the animals to describe their time on Earth and the challenges they faced, in the hope of reversing apathy towards the biodiversity crisis.
We built an AI tool to help set priorities for conservation in Madagascar: what we found
Artificial intelligence (AI) – a broad family of models that process large and diverse datasets and make predictions from them – can have many uses in nature conservation, such as remote monitoring (for example, with camera traps to study animals or plants) or data analysis. Some of these uses are controversial because AI can be trained to be biased, but others are valuable research tools. Biologist Daniele Silvestro has developed an AI tool that can help identify conservation and restoration priorities. We asked him to tell us more about how it works and what it offers. We built a model using biodiversity datasets as well as socioeconomic data.
#ICLR2023 invited talks: exploring artificial biodiversity, and systematic deviations for trustworthy AI
The 11th International Conference on Learning Representations (ICLR) is taking place this week in Kigali, Rwanda, the first time a major AI conference has taken place in person in Africa. The program includes workshops, contributed talks, affinity group events, and socials. In addition, a total of six invited talks covered a broad range of topics. In this post we give a flavour of the first two of these presentations. Sofia Crespo is an artist who explores the interaction between biological systems and AI.
The Effect of Epigenetic Blocking on Dynamic Multi-Objective Optimisation Problems
Yuen, Sizhe, Ezard, Thomas H. G., Sobey, Adam J.
Hundreds of Evolutionary Computation approaches have been reported. From an evolutionary perspective they focus on two fundamental mechanisms: cultural inheritance in Swarm Intelligence and genetic inheritance in Evolutionary Algorithms. Contemporary evolutionary biology looks beyond genetic inheritance, proposing a so-called "Extended Evolutionary Synthesis". Many concepts from the Extended Evolutionary Synthesis have been left out of Evolutionary Computation as interest has moved toward specific implementations of the same general mechanisms. One such concept is epigenetic inheritance, which is increasingly considered central to evolutionary thinking. Epigenetic mechanisms allow quick non-genetic or partially genetic adaptations to environmental changes. Dynamic multi-objective optimisation problems represent similar circumstances to the natural world, where fitness can be determined by multiple objectives (traits) and the environment is constantly changing. This paper asks if the advantages that epigenetic inheritance provides in the natural world are replicated in dynamic multi-objective optimisation problems. Specifically, an epigenetic blocking mechanism is applied to a state-of-the-art multi-objective genetic algorithm, MOEA/D-DE, and its performance is compared on three sets of dynamic test functions: FDA, JY, and UDF. The mechanism shows improved performance on 12 of the 16 test problems, providing initial evidence that more algorithms should explore the wealth of epigenetic mechanisms seen in the natural world.
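The abstract's core idea — blocking gene expression as a fast, non-genetic response to a changing objective — can be illustrated with a toy sketch. This is a deliberate simplification, not the paper's MOEA/D-DE mechanism: the genome, the single moving target, and the hill-climbing loop are all hypothetical stand-ins.

```python
import random

def express(genome, blocked):
    # Epigenetically blocked genes do not contribute to the phenotype.
    return [g for g, b in zip(genome, blocked) if not b]

def fitness(phenotype, target):
    # Toy single objective: match the sum of expressed genes to a moving target.
    return -abs(sum(phenotype) - target)

def mutate_blocking(blocked, rate=0.2):
    # Flipping block flags is a quick adaptation that leaves the genome intact.
    return [(not b) if random.random() < rate else b for b in blocked]

def adapt(genome, blocked, target, generations=200):
    # Hill-climb over block masks only, accepting non-worsening candidates.
    best = blocked
    for _ in range(generations):
        cand = mutate_blocking(best)
        if fitness(express(genome, cand), target) >= fitness(express(genome, best), target):
            best = cand
    return best

random.seed(0)
genome = [1, 2, 3, 4, 5]
blocked = [False] * 5
# The environment shifts: the target drops from 15 (all genes expressed) to 6.
adapted = adapt(genome, blocked, target=6)
```

Because only the mask mutates, the population's genetic material is preserved, so a later reversal of the environment can be tracked just as quickly — the property that makes epigenetic mechanisms attractive for dynamic problems.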
How artificial intelligence helps 2 environmental scientists unlock the natural world's mysteries
Machine learning is a specific form of artificial intelligence. Through algorithms designed to learn from experience, machine learning (ML) adapts and grows more efficient over time as more data is added. An ML-driven program "learns" from its mistakes, and in doing so can reduce the time it takes to analyze mountains of data from years to minutes. Two recently hired faculty members, Melissa Guzman, Gabilan Assistant Professor of Biological Sciences, and Sam Silva, assistant professor of Earth sciences, both at the USC Dornsife College of Letters, Arts and Sciences, are already garnering attention for their use of machine learning to find insights into the seemingly unknowable: the patterns underlying the natural world.
Scientists build first self-powered 'liquibots' that run continuously without electricity
Inspired by water-walking insects, scientists have built liquid robots that work autonomously and continuously without the need for electrical inputs, transporting chemicals back and forth while partially submerged in solution. The "liquibot" technology may lead to further developments in automated chemical synthesis or drug-delivery systems for pharmaceuticals, say the researchers, including those from Lawrence Berkeley National Laboratory in the US. Earlier studies had demonstrated liquibots that autonomously perform a task, but only once, and others that can perform a task continuously, but need electricity to do so. In the new research, published in the journal Nature Chemistry, scientists demonstrated the first self-powered liquid robots, which look like little open sacks just 2 mm in diameter and can run continuously on energy from the chemicals in their surroundings instead of electricity. "We have broken a barrier in designing a liquid robotic system that can operate autonomously by using chemistry to control an object's buoyancy," study co-author Tom Russell of Berkeley Lab's Materials Sciences Division said in a statement.
Ecology in the age of automation
The accelerating pace of global change is driving a biodiversity extinction crisis (1) and is outstripping our ability to track, monitor, and understand ecosystems, which is traditionally the job of ecologists. Ecological research is an intensive, field-based enterprise that relies on the skills of trained observers. This process is both time-consuming and expensive, thus limiting the resolution and extent of our knowledge of the natural world. Although technology will never replace the intuition and breadth of skills of the experienced naturalist (2), ecologists cannot ignore the potential to greatly expand the scale of our studies through automation. The capacity to automate biodiversity sampling is being driven by three ongoing technological developments: the commoditization of small, low-power computing devices; advances in wireless communications; and an explosion in automated data-recognition algorithms in the field of machine learning. Automated data collection and machine learning are set to revolutionize in situ studies of natural systems. Automation has swept across all human endeavors over recent decades, and science is no exception. The extent of ecological observation has traditionally been limited by the costs of manual data collection. We envision a future in which data from field studies are augmented with continuous, fine-scale, remotely sensed data recording the presence, behavior, and other properties of individual organisms. As automation drives down costs of these networks, there will not be a simple expansion of the quantity of data. Rather, the potential high resolution and broad extent of these data will lead to qualitatively new findings and will result in new discoveries about the natural world that will enable ecologists to better predict and manage changing ecosystems (3).
This will be especially true as different types of sensing networks, including mobile elements such as drones, are connected together to provide a rich, multidimensional view of nature. Given the role that biodiversity plays in lending resilience to the ecosystems on which humans depend (4), monitoring the distribution and abundance of species along with climate and other variables is a critical need in developing ecological hypotheses and for adapting to emerging global challenges. Ecosystems are alive with sound and motion that can be captured with audio and video sensors. Rapid advances in audio and video classification algorithms can allow the recognition of species and labeling of complex traits and behaviors, which were traditionally the domain of manual species identification by experts. The major advance has been the discovery of deep convolutional neural networks (5). These algorithms extract fundamental aspects of contrast and shape in a manner analogous to how we and other animals recognize objects in our visual field. Applied to audio signals, these neural networks are highly effective at classifying natural and anthropogenic sounds (6). A canonical example is the classification of bird songs. Other acoustic examples include insects, amphibians, and disturbance indicators such as chainsaws. Naturally, these algorithms also lend themselves to species identification from images and videos. In cases of animals displaying complex color patterns, individuals may be distinguished, allowing minimally invasive mark-recapture, an important tool in population studies and conservation (7). Beyond sight and sound, sensors can target a wide range of physical, chemical, and biological phenomena. Particularly intriguing is the possibility for widespread environmental sensing of biomolecular compounds that could, for example, allow quantification of "DNA-scapes" by means of laboratory-on-a-chip-type sensors (8).
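The contrast-extraction step these networks perform can be sketched from scratch: a one-dimensional filter slides along a signal, a nonlinearity keeps positive responses, and pooling summarizes them into a single feature. The filter values, toy signals, and pooling choice here are illustrative, not a real bird-song classifier.

```python
def conv_valid(signal, kernel):
    # Slide the filter along the signal, one response per "valid" position.
    n = len(kernel)
    return [sum(s * k for s, k in zip(signal[i:i + n], kernel))
            for i in range(len(signal) - n + 1)]

def pooled_feature(signal, kernel):
    # ReLU then max-pool: one scalar response per filter, the basic
    # building block of convolutional feature extraction.
    return max(max(r, 0.0) for r in conv_valid(signal, kernel))

# A "contrast" filter that responds to rising amplitude, loosely
# analogous to a first-layer filter in a convolutional network.
contrast = [-1.0, 0.0, 1.0]

onset = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # e.g. a sound beginning
silence = [0.0] * 6

f_onset = pooled_feature(onset, contrast)      # filter fires on the onset
f_silence = pooled_feature(silence, contrast)  # no response to a flat signal
```

A real network stacks many such filters in many layers and learns their values from labeled examples, but the slide-nonlinearity-pool pattern is the same.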
Several technological trends are shaping the emergence of large-scale sensor networks. One is the ongoing miniaturization of technology, allowing deployment of extended arrays of low-power sensor devices across landscapes [for example, (9)]. In many cases, these can be solar-powered in remote locations. The widespread availability of computer-on-a-chip devices along with various attached sensors is enabling the construction of large distributed sensing networks at price points that were formerly unattainable. Similarly, the ubiquitous availability of cloud-based computing and storage for back-end processing is facilitating large-scale deployments. Another trend is advancements in wireless communications. For example, the emerging internet of things (10) enables low-power devices to establish ad hoc mesh networks that can pass information from node to node, eventually reaching points of aggregation and analysis. The same technology used to connect smart doorbells and lightbulbs can be leveraged to move data across sensor networks distributed across a landscape. These protocols are designed for low power consumption but may not have sufficient bandwidth for all applications. An alternative, although more power hungry, is cellular technology, which has increasing coverage globally. In remote locations, where commercial cellular data services may not be available, researchers can consider a private cellular network for on-site telemetry and satellite uplinks for internet streaming. However, in the near term, telecommunications costs and per-device power requirements may nonetheless prove prohibitive in certain high-bandwidth applications, such as video and audio streaming. An alternative for sites where communications bandwidth is limited by cost, isolation, or power constraints is edge computing (11).
In this design, computation is moved to the sensing devices themselves, which then transmit filtered or classified results for analysis, greatly reducing transmission requirements. One more trend is the advancement of machine-learning methods (12) that can classify and extract patterns from data streams. Much of this technology has been commoditized through intensive development efforts in the technology sector that have resulted in widely available software libraries usable by nonexperts. The aforementioned convolutional neural networks can be coded both to segment data into units and to label these units with appropriate classes. The major bottleneck is in training classifiers because initial training inputs must be labeled manually by experts. Although labeled training sets exist in some domains (most notably, image recognition), future analysts may be able to skip much of the training step as large collections of pretrained networks become available. These pretrained networks can be combined and modified for specific tasks without the requirement of comprehensive training sets. Of particular interest from the standpoint of automation are new developments in continual learning (13), in which networks adjust in response to changing inputs. This holds the promise of automating model adaptation for detecting emerging phenomena, such as species shifting their ranges in response to climate change or other shifts in ecosystem properties. Ecologists could leverage these developments to create automated sensing networks at scales previously unimaginable. As an example, consider the North American Breeding Bird Survey, a highly successful citizen-science initiative running since the late 1960s with continental-scale coverage. Expert observers conduct point counts of birds along routes, generating data that have proved invaluable in tracking trends in songbird populations (14).
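The edge-computing design described above can be sketched in a few lines: a stand-in classifier runs on the sensing device, and only compact, high-confidence detections are transmitted rather than the raw audio stream. The energy-based "model", the class names, and the threshold are all hypothetical placeholders for an on-device network.

```python
def classify(window):
    # Stand-in for an on-device model: score one window of samples.
    energy = sum(x * x for x in window) / len(window)
    label = "bird_song" if energy > 0.5 else "background"
    confidence = min(1.0, energy)
    return label, confidence

def edge_filter(stream, window_size=4, threshold=0.8):
    # Run classification at the edge; transmit only confident,
    # non-background detections instead of the raw stream.
    transmitted = []
    for i in range(0, len(stream) - window_size + 1, window_size):
        label, conf = classify(stream[i:i + window_size])
        if label != "background" and conf >= threshold:
            transmitted.append((i, label, round(conf, 2)))
    return transmitted

# Twelve raw samples in, one small message out: the bandwidth saving
# is the whole point of the edge design.
raw = [0.1, 0.0, 0.2, 0.1, 1.0, 0.9, 1.1, 1.0, 0.1, 0.2, 0.0, 0.1]
msgs = edge_filter(raw)
```

On a real deployment the trade-off runs the other way too: filtering at the device means misclassified events are lost rather than archived, which is one reason the text frames edge computing as an alternative for bandwidth-constrained sites rather than a default.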
Although we hope to see such efforts continue, imagine what could be learned if, instead of sampling these communities once per year, a long-term, continental-scale songbird observatory could be constructed to record and classify bird vocalizations in near-real time along with environmental covariates. Similar networks could use camera traps or video streams to reveal details of diurnal and seasonal variation across diverse floras and faunas. As with all sampling methods, sensing networks will not be without biases in sensitivity and discrimination, yet they hold the extraordinary promise of regional sampling of biodiversity at the organismal scale, something that has proven difficult, for example, using traditional satellite-based remote sensing. These efforts would complement ongoing development of continental-scale observatories in ecology [for example, (15)] by increasing the spatial and temporal resolution of sampling.

References
1. S. Díaz et al., Science 366, eaax3100 (2019).
2. J. Travis, Am. Nat. 196, 1 (2020).
3. M. C. Dietze et al., Proc. Natl. Acad. Sci. U.S.A. 115, 1424 (2018).
4. B. J. Cardinale et al., Nature 486, 59 (2012).
5. Y. LeCun, Y. Bengio, G. Hinton, Nature 521, 436 (2015).
6. S. S. Sethi et al., Proc. Natl. Acad. Sci. U.S.A. 117, 17049 (2020).
7. R. C. Whytock et al., Methods Ecol. Evol. 12, 1080 (2021).
8. B. C. Dhar, N. Y. Lee, Biochip J. 12, 173 (2018).
9. A. P. Hill et al., Methods Ecol. Evol. 9, 1199 (2018).
10. L. Atzori, A. Iera, G. Morabito, Comput. Netw. 54, 2787 (2010).
11. W. Shi, J. Cao, Q. Zhang, Y. Li, L. Xu, IEEE Internet Things J. 3, 637 (2016).
12. M. I. Jordan, T. M. Mitchell, Science 349, 255 (2015).
13. R. Aljundi, K. Kelchtermans, T. Tuytelaars, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 11254–11263.
14. J. R. Sauer, W. A. Link, J. E. Fallon, K. L. Pardieck, D. J. Ziolkowski Jr., N. Am. Fauna 79, 1 (2013).
15. M. Keller, D. S. Schimel, W. W. Hargrove, F. M. Hoffman, Front. Ecol. Environ. 6, 282 (2008).

Acknowledgments: Our perspective on autonomous sensing was developed with the support of the Stengl-Wyer Endowment and the Office of the Vice President for Research Bridging Barriers programs at the University of Texas at Austin, and the National Science Foundation (BCS-2009669). Comments from members of the Keitt laboratory, Planet Texas 2050, A. Wolf, and M. Abelson were invaluable in refining our ideas.